Kernel Sliced Inverse Regression: Regularization and Consistency
Similar Resources
Consistency of regularized sliced inverse regression for kernel models
We develop an extension of the sliced inverse regression (SIR) framework for dimension reduction using kernel models and Tikhonov regularization. The result is a numerically stable nonlinear dimension reduction method. We prove consistency of the method under weak conditions even when the reproducing kernel Hilbert space induced by the kernel is infinite dimensional. We illustrate the utility o...
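As a concrete illustration of the idea, a kernel SIR estimator with Tikhonov regularization can be sketched in Python. This is a hedged reconstruction, not the authors' exact estimator: it assumes a common formulation in which, given an n-by-n Gram matrix K, one centers K, builds a between-slice averaging matrix J, and solves the regularized generalized eigenproblem K J K a = lambda (K K + s n I) a for the coefficients of the e.d.r. functions f(x) = sum_i a_i k(x_i, x).

```python
import numpy as np

def kernel_sir(K, y, s=1e-3, n_slices=10, n_components=2):
    """Hedged sketch of kernel SIR with Tikhonov regularization.

    K : (n, n) kernel (Gram) matrix on the training inputs.
    Returns coefficient vectors a for e.d.r. functions
    f(x) = sum_i a_i k(x_i, x), one column per component.
    """
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H  # centered Gram matrix

    # Between-slice averaging matrix: J[i, j] = 1/n_h when
    # observations i and j fall in the same slice of sorted y.
    J = np.zeros((n, n))
    for idx in np.array_split(np.argsort(y), n_slices):
        J[np.ix_(idx, idx)] = 1.0 / len(idx)

    # Regularized generalized eigenproblem:
    #   Kc J Kc a = lambda (Kc Kc + s n I) a
    # The s*n*I term is the Tikhonov regularizer that keeps the
    # problem well-posed in an infinite-dimensional RKHS.
    A = Kc @ J @ Kc
    B = Kc @ Kc + s * n * np.eye(n)
    w, V = np.linalg.eig(np.linalg.solve(B, A))
    top = np.argsort(w.real)[::-1][:n_components]
    return V[:, top].real
```

With a linear kernel K = X X^T this reduces (up to regularization) to ordinary SIR in the span of the data, which gives a simple sanity check.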
Localized Sliced Inverse Regression
We develop localized sliced inverse regression for supervised dimension reduction. It has the advantages of preventing degeneracy, increasing estimation accuracy, and enabling automatic subclass discovery in classification problems. A semisupervised version is proposed to make use of unlabeled data. The utility is illustrated on simulated as well as real data sets.
Student Sliced Inverse Regression
Sliced Inverse Regression (SIR) has been extensively used to reduce the dimension of the predictor space before performing regression. SIR is originally a model-free method, but it has been shown to correspond to the maximum likelihood of an inverse regression model with Gaussian errors. This intrinsic Gaussianity of standard SIR may explain its high sensitivity to outliers, as observed ...
Asymptotics of Sliced Inverse Regression
Sliced Inverse Regression is a method for reducing the dimension of the explanatory variables x in non-parametric regression problems. Li (1991) discussed a version of this method that begins with a partition of the range of y into slices, so that the conditional covariance matrix of x given y can be estimated by the sample covariance matrix within each slice. After that, the mean of the conditi...
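The slicing procedure described in this abstract can be sketched in a few lines of Python. This is a minimal sketch of Li's (1991) estimator under standard assumptions (standardized predictors, equal-size slices by the order of y), not a faithful reproduction of any one paper's implementation:

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_components=2):
    """Minimal sketch of Sliced Inverse Regression (Li, 1991).

    Standardizes x, slices the range of y, averages the standardized
    x within each slice, and eigendecomposes the weighted covariance
    of these slice means to estimate e.d.r. directions.
    """
    n, p = X.shape
    # Standardize predictors: whiten with the inverse square-root
    # of the sample covariance (assumed well-conditioned here).
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Partition observations into slices by the order of y.
    slices = np.array_split(np.argsort(y), n_slices)

    # Weighted covariance of the within-slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original scale.
    w, V = np.linalg.eigh(M)
    top = V[:, np.argsort(w)[::-1][:n_components]]
    return Sigma_inv_sqrt @ top
```

For a response driven by a single linear direction, the leading estimated direction should align closely with the true one.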
Gaussian Regularized Sliced Inverse Regression
Sliced Inverse Regression (SIR) is an effective method for dimension reduction in high-dimensional regression problems. The original method, however, requires the inversion of the predictors' covariance matrix. In the case of collinearity among the predictors, or of sample sizes that are small relative to the dimension, the inversion is not possible and a regularization technique has to be used. Our approach...
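One standard way to sidestep the covariance inversion is ridge-type (Tikhonov) regularization. The sketch below is an assumption-laden illustration of that general idea, not the specific approach of this paper: it replaces the inverse of Sigma with the inverse of Sigma + s I inside the SIR eigenproblem, which remains well-posed under collinearity.

```python
import numpy as np

def regularized_sir(X, y, s=1e-2, n_slices=10, n_components=2):
    """Hedged sketch: SIR with Tikhonov (ridge) regularization.

    Instead of inverting the predictor covariance Sigma directly,
    solve the generalized eigenproblem M b = lambda (Sigma + s I) b,
    which stays well-posed when Sigma is singular or ill-conditioned.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n

    # Between-slice covariance of the conditional means of x given
    # the slice of y that each observation falls in.
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Regularized generalized eigenproblem via a linear solve;
    # eigenvalues are real since M is PSD and Sigma + s I is PD.
    A = np.linalg.solve(Sigma + s * np.eye(p), M)
    w, V = np.linalg.eig(A)
    top = np.argsort(w.real)[::-1][:n_components]
    return V[:, top].real
```

Even with an exactly collinear predictor (where plain inversion fails), the regularized solve recovers a direction whose projection tracks the response.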
Journal
Journal title: Abstract and Applied Analysis
Year: 2013
ISSN: 1085-3375, 1687-0409
DOI: 10.1155/2013/540725